Stand Up for Privacy
Yesterday, on June 16, the Taskforce on Innovation, Growth and Regulatory Reform published a report outlining their vision to “refresh the UK’s approach to regulation now that we have left the EU”.
What’s being presented as an “independent report” has in fact been written by a taskforce of three Conservative MPs, selected by a Conservative Prime Minister to advise a Conservative Government in pursuit of its deregulatory agenda.
Yet, the Report is significant, because it signals the Government’s desire to gut GDPR and your privacy rights. Some of these suggestions are likely to appear as policy proposals in the near future.
Open Rights Group reviewed its proposal to “Replace GDPR with a new UK framework for data protection”: it runs to just six pages, yet it is riddled with blatant lies and obtuse interpretations of the few facts it presents, all deployed to bless potential Government proposals.
See no evil, hear no evil, speak rubbish
The Report builds its case for deregulating GDPR on the need to “accelerate growth in the digital economy, and improve productivity and people’s lives by freeing them up from onerous compliance requirements”. In doing so, it depicts the GDPR as “prescriptive and inflexible and particularly onerous for smaller companies and charities to operate”.
However, onerous compliance requirements on charities, SMEs and voluntary organisations are a direct result of poor enforcement, not of the GDPR itself. For instance, if your cloud service makes sensitive information available to US law enforcement authorities or advertisers, you will have a hard time complying with your legal obligations. This is where the Information Commissioner’s Office should come into play and ensure that unsafe products are removed from the market until they meet safety standards: in essence, the role of a regulator. The ICO, however, is held back by the Government’s unsavoury obsession with a “business friendly attitude” and “growth-oriented regulatory approaches”, resulting in slow and patchy enforcement that avoids the difficult questions.
The deficiencies arising from the ICO’s lack of enforcement are even more evident in the second example the Report raises against the GDPR: tech giants bombarding users with cookie banners and “complex consent requests”. Open Rights Group and others lodged a complaint against online advertising practices back in 2018. The ICO found the adtech industry to be operating illegally that same year, but then got lost in a long process of “industry engagement” and pats on the back that left these illegal practices unchallenged.
A properly enforced GDPR would soon get rid of cookie banners: they front an industry that is operating unlawfully, and they do not obtain meaningful consent. We know that, when asked plainly, around 96% of Apple’s users declined to be tracked. Users don’t want to be pestered by cookie walls, and they don’t want to be tracked either. The Report’s solution, however, is to remove consent entirely.
Government never learn their lessons
The Report reaches the peak of its scary, incoherent ideas when it proposes scrapping Article 22 of the GDPR and eliminating human review from automated decision-making, because the provision “makes it burdensome, costly and impractical for organisations to use AI to automate routine processes”.
It is embarrassing to have to write this, but there are indeed good reasons why human review is required for algorithmic decision-making. First of all, this requirement does not apply to every use of AI, but only to decisions that produce legal effects or otherwise significantly affect individuals. If a decision that produces a legal obligation were not reviewable, AI would effectively become a backdoor to justify any kind of abuse: think of a company claiming a debt against you based on an automated decision that cannot be understood, or the Government denying benefits to an individual based on criteria that cannot be scrutinised.
The stated intent to get rid of Article 22 is even more worrying when put into perspective. During the A-level fiasco, an entire generation hit the streets chanting “fuck the algorithm”, following the unfair automated computation of their exam results. Now that the dust has settled, the Government seems committed to ensuring that its next mistake escapes any future scrutiny. Scrapping Article 22 would remove the only safeguard that stood between UK students and an unfair denial of their once-in-a-lifetime chance to enter university.
This Report concludes by emphasising that “It is particularly important that changes are made to permit automated decision-making for machine learning and to remove the human review of algorithmic decisions required by GDPR”. The Report’s insistence on this point may not be merely a display of grievance against a rule that holds the Government accountable; it may very well be a carefully calculated step. In the Netherlands, the Government had to resign after a report showed that automated decision-making for detecting tax fraud had been used in an openly discriminatory manner against migrants. This may worry the UK Government too, which is planning to introduce new powers for the Cabinet Office to carry out data matching to detect fraud and other crimes.
Government cannot be trusted
There is a final aspect of the Report’s reasoning about data protection as an obstacle to Artificial Intelligence that needs to be raised. In describing how GDPR “hinders” AI development, the authors go as far as stating that
“Article 5 of GDPR requires data be “collected for specified, explicit and legitimate purposes,” and “adequate, relevant and limited to what is necessary”. These restrictions limit AI because they prevent AI organisations from collecting new data before they understand its potential value and they also mean that existing data cannot be reused for novel purposes”.
As a matter of fact, the GDPR contains provisions that allow data to be further processed for statistical and other research purposes, provided that additional safeguards are put in place. With this paragraph, however, the Report finally drops the act and reveals an intention not to reform but to cancel data protection altogether. Purpose and storage limitation are the cornerstones of any safeguard for individuals against abuse. If I provide my information to the NHS for contact tracing, and that information is passed on to the authorities or to my employer to check whether I took part in a strike, that is an abuse, and it unequivocally puts me at risk.
It is worth noting that purpose and storage limitation aren’t legal requirements under the GDPR alone. The European Convention on Human Rights prohibits interference with one’s private life that is not necessary and proportionate, of which purpose and storage limitation are a clear emanation. Furthermore, the Council of Europe data protection convention also provides that data must be “collected for explicit, specified and legitimate purposes and not processed in a way incompatible with those purposes”, and “preserved … for no longer than is necessary for the purposes for which those data are processed”.
Thus, these plans would not only conflict with the very notion of data protection rights; they would also mark yet another instance of the UK Government going rogue on the international stage and wilfully disregarding international obligations it previously committed to, in this case by signing the European Convention on Human Rights and ratifying the CoE data protection convention. This proposed approach should ring alarm bells in the EU, which is trying to establish a regime for the free flow of personal data based on UK adherence to these very obligations.
Government agenda cannot be driven by grievance
Government spent the last year breaking every possible data protection law in its response to Coronavirus, and now pleads the need to get rid of data protection’s “onerous prescriptions”. Government stumbled over A-level exam results, and now seeks to remove the provisions that would hold it accountable should it repeat the mistake. Finally, Government is doing all of this in the name of the opportunities that departing from the EU bloc allegedly created.
In doing so, the Report’s authors and the Government seem committed to steamrolling every reasonable consideration: human rights, our safety, and the UK’s reputation on the international stage. If they are allowed to proceed, they will not benefit UK business, but rather help seed unethical and discriminatory business practices that will undermine trust not just in digital industries, but also between citizens and society.
Grievance should not and cannot be the basis of UK policy making. Decisions that will affect our rights and the society we live in cannot be based on reports filled with lies, omissions, and misrepresentations.
In other words, this Report ought to be repudiated by MPs and Government. Its proposals deserve to be buried, but we fear they may resurface in legislative proposals all too quickly. Please join our mailing list to stay in touch and help us fight these appalling proposals.